New Neural Transfer Functions
Abstract
The choice of transfer functions in neural networks is of crucial importance to their performance. Although sigmoidal transfer functions are the most common, there is no a priori reason why they should be optimal in all cases. In this article the advantages of various neural transfer functions are discussed and several new types of functions are introduced. Universal transfer functions, parametrized to change from a localized to a delocalized type, are of greatest interest. Biradial functions are formed from products or linear combinations of two sigmoids. Products of N biradial functions in N-dimensional input space give densities of arbitrary shapes, offering great flexibility in modelling the probability density of the input vectors. Extensions of biradial functions, offering a good tradeoff between the complexity of transfer functions and the flexibility of the densities they can represent, are proposed. Biradial functions can be used as transfer functions in many types of neural networks, such as RBF, RAN, FSM and IncNet. Using such functions and going to the hard limit (steep slopes) facilitates logical interpretation of the network's performance, i.e. extraction of logical rules from the training data.
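A biradial function built as the product of two sigmoids can be sketched as follows. This is a minimal illustration of the construction described above, assuming the common parametrization with a center, a half-width and a slope (the parameter names `t`, `b`, `s` are chosen here for illustration, not taken from the article):

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def biradial(x, t=0.0, b=1.0, s=1.0):
    """Biradial function: a product of two sigmoids.

    Assumed parametrization (hypothetical names):
      t - center of the window,
      b - half-width of the window,
      s - slope of both sigmoids.
    For large s the product approaches a rectangular window,
    which is what makes logical-rule extraction possible.
    """
    return sigmoid(s * (x - t + b)) * (1.0 - sigmoid(s * (x - t - b)))

def biradial_nd(x, t, b, s):
    """Product of N one-dimensional biradial functions:
    a localized density in N-dimensional input space."""
    x, t, b, s = map(np.asarray, (x, t, b, s))
    return np.prod(biradial(x, t, b, s))
```

Each one-dimensional factor is a soft "bump" centered at `t`; multiplying N such factors yields a localized density whose shape along each input dimension is controlled independently, which is the flexibility the abstract refers to.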
Similar articles
Predicting the Coefficients of Antoine Equation Using the Artificial Neural Network (TECHNICAL NOTE)
Neural network is one of the new soft computing methods commonly used for prediction of the thermodynamic properties of pure fluids and mixtures. In this study, we have used this soft computing method to predict the coefficients of the Antoine vapor pressure equation. Three transfer functions of tan-sigmoid (tansig), log-sigmoid (logsig), and linear were used to evaluate the performance of diff...
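The three transfer functions mentioned in this abstract can be written out explicitly. A minimal sketch in Python, using the definitions behind MATLAB's `tansig`, `logsig` and linear (`purelin`) names; this is for illustration only, not the authors' code:

```python
import numpy as np

def tansig(x):
    """Tan-sigmoid: 2/(1+exp(-2x)) - 1, identical to tanh(x); range (-1, 1)."""
    return 2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0

def logsig(x):
    """Log-sigmoid: 1/(1+exp(-x)); range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def purelin(x):
    """Linear transfer function: identity, typically used in the output layer."""
    return x
```

In regression settings such as predicting equation coefficients, the sigmoidal functions are usually placed in the hidden layer and the linear function in the output layer, so the network's output range is unbounded.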
Transfer functions: hidden possibilities for better neural networks
Sigmoidal or radial transfer functions guarantee neither the best generalization nor fast learning of neural networks. Families of parameterized transfer functions provide flexible decision borders. Networks based on such transfer functions should be small and accurate. Several possibilities of using transfer functions of different types in neural models are discussed, including enhancement of i...
Flexible Transfer Functions with Ontogenic Neural Networks
Transfer functions play a very important role in the learning process of neural systems. This paper presents new functions which are more flexible than other functions commonly used in artificial neural networks. The latest improvement is the ability to rotate the contours of constant values of transfer functions in multidimensional spaces with only N − 1 adaptive parameters. Rotation using f...
A new method for fuzzification of nested dummy variables by fuzzy clustering membership functions and its application in financial economy
In this study, the aim is to propose a new method for fuzzification of nested dummy variables. The idea of fuzzifying dummy variables is taken from the non-linear part of regime-switching models in econometrics. In these models, the concept of transfer functions is like the notion of fuzzy membership functions, but no principles or linguistic sentences are used for the inputs. Consequen...
Taxonomy of Neural Transfer Functions
The choice of transfer functions may strongly influence the complexity and performance of neural networks used in classification and approximation tasks. A taxonomy of activation and output functions is proposed, allowing one to generate many transfer functions. Several less-known types of transfer functions and new combinations of activation/output functions are described. Functions parameterized to ch...